
    Income and distance elasticities of values of travel time savings: New Swiss results

    This paper presents the findings of a study into the valuation of travel time savings (VTTS) in Switzerland, across modes as well as across purpose groups. The study makes several departures from the usual practice in VTTS studies, the main one being a direct representation of the income and distance elasticities of the VTTS measures. This approach yields important gains in model performance and significantly different results. Additionally, the analysis shows that the estimation of robust coefficients for congested car travel time is hampered by the low share of congested time in the overall travel time, and that using an additional rate-of-congestion coefficient, alongside a generic car travel time coefficient, is preferable. Finally, the analysis demonstrates that the population means of the calculated indicators are quite different from the sample means, and presents methods to compute them along with the associated variances. These variances are of great interest as they allow the generation of confidence intervals, which can be extremely useful in cost-benefit analyses.
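
    A common way to build income and distance elasticities directly into the VTTS, in the spirit described above, is through power terms on income and distance; the notation below (reference income $I_{\text{ref}}$, reference distance $d_{\text{ref}}$, elasticities $\lambda_I$ and $\lambda_d$) is an illustrative sketch rather than the paper's exact specification:
    \[
    \mathrm{VTTS} \;=\; \frac{\beta_{\text{time}}}{\beta_{\text{cost}}}\,\left(\frac{I}{I_{\text{ref}}}\right)^{\lambda_I}\left(\frac{d}{d_{\text{ref}}}\right)^{\lambda_d},
    \]
    so that $\lambda_I$ and $\lambda_d$ can be read directly as the income and distance elasticities of the VTTS.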

    Capturing trade-offs between daily scheduling choices

    We propose a new modelling approach for daily activity scheduling which integrates the different daily scheduling choice dimensions (activity participation, location, schedule, duration and transportation mode) into a single optimisation problem. The fundamental behavioural principle behind our approach is that individuals schedule their day to maximise the overall utility they derive from the activities they complete, according to their individual needs, constraints and preferences. By combining multiple choices into a single optimisation problem, our framework is able to capture the complex trade-offs between scheduling decisions for multiple activities, such as how spending longer on one activity reduces the time available for other activities, or how the order of activities determines the travel times. The implemented framework takes as input a set of considered activities, with associated locations and travel modes, and uses these to produce empirical distributions of individual schedules from which different daily schedules can be drawn. The model is illustrated using historic trip diary data from the Swiss Mobility and Transport Microcensus. The results demonstrate the ability of the proposed framework to generate complex and realistic distributions of starting time and duration for different activities within tight time constraints. The generated schedules are then compared to the aggregate distributions from the historical data to demonstrate the feasibility and flexibility of our approach.
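
    As an illustration of the single optimisation problem described above, a daily schedule can be sketched as follows; the notation is assumed here for exposition and is not taken from the paper:
    \[
    \max_{S,\,x,\,\tau,\,\ell,\,m}\;\; \sum_{a \in S} U_a\!\left(x_a, \tau_a, \ell_a, m_a\right) \;-\; \sum_{(a,b)\ \text{consecutive}} c\!\left(t(\ell_a, \ell_b, m_b)\right)
    \quad\text{s.t.}\quad x_b \;\ge\; x_a + \tau_a + t(\ell_a, \ell_b, m_b),
    \]
    where $S$ is the set of scheduled activities, $x_a$ and $\tau_a$ are the start time and duration of activity $a$, $\ell_a$ and $m_a$ its location and mode, $t(\cdot)$ the travel time between consecutive locations, and all activities and travel must fit within the 24-hour day.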

    Real-time estimation and prediction of origin-destination tables

    The problem of estimating origin-destination (OD) tables from count data is of prime importance for a large number of applications involving the modelling of a transportation system. Indeed, these tables statistically capture the demand, which conditions the operation of the entire system. The purpose of this paper is to summarise the characteristics of this problem, to describe it from several angles (static, dynamic, real-time), and to present recent results on its solution proposed by the author.
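
    For the static case, one standard formulation of this problem (given here as a generic sketch, not necessarily the exact estimator used by the author) is the generalised least squares estimator combining a prior table $\mathbf{x}^{0}$ with link counts $\mathbf{y}$:
    \[
    \hat{\mathbf{x}} \;=\; \arg\min_{\mathbf{x} \ge 0}\; (\mathbf{x}-\mathbf{x}^{0})^{\top}\mathbf{W}_{1}^{-1}(\mathbf{x}-\mathbf{x}^{0}) \;+\; (\mathbf{y}-\mathbf{A}\mathbf{x})^{\top}\mathbf{W}_{2}^{-1}(\mathbf{y}-\mathbf{A}\mathbf{x}),
    \]
    where $\mathbf{x}$ is the vector of OD flows, $\mathbf{A}$ the assignment matrix mapping OD flows onto link flows, and $\mathbf{W}_{1}$, $\mathbf{W}_{2}$ weight matrices reflecting the confidence placed in the prior table and in the counts, respectively.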

    Assisted specification of discrete choice models

    Determining appropriate utility specifications for discrete choice models is time-consuming and prone to errors. As datasets grow ever larger and the number of possible specifications grows exponentially with the number of variables under consideration, analysts must spend increasing amounts of time searching for good models through trial and error, while expert knowledge is required to ensure these models are sound. This paper proposes an algorithm that aims at assisting modelers in their search. Our approach translates the task into a multi-objective combinatorial optimization problem and uses a variant of the variable neighborhood search algorithm to generate sets of promising model specifications. We apply the algorithm both to semi-synthetic data and to real mode choice datasets as a proof of concept. The results demonstrate its ability to provide relevant insights in reasonable amounts of time so as to effectively assist the modeler in developing interpretable and powerful models.
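
    As a rough sketch of how such a search over specifications could be organised, the snippet below implements a basic variable neighborhood search over binary inclusion vectors; the function names, the scalar placeholder objective and the parameters are illustrative assumptions, whereas the paper addresses a multi-objective problem and estimates actual choice models at each evaluation.

```python
import random

N_VARS = 10            # number of candidate variables (illustrative)
MAX_NEIGHBOURHOOD = 3  # largest number of simultaneous flips

def evaluate_spec(spec):
    """Placeholder objective: in a real setting this would estimate the
    choice model implied by `spec` and return a fit/parsimony measure."""
    return -sum(spec)  # dummy objective favouring parsimonious models

def shake(spec, k):
    """Return a copy of `spec` with k randomly chosen inclusion flags flipped."""
    new = list(spec)
    for i in random.sample(range(len(spec)), k):
        new[i] = 1 - new[i]
    return new

def vns(iterations=100):
    """Basic variable neighbourhood search over binary specification vectors."""
    best = [random.randint(0, 1) for _ in range(N_VARS)]
    best_val = evaluate_spec(best)
    for _ in range(iterations):
        k = 1
        while k <= MAX_NEIGHBOURHOOD:
            candidate = shake(best, k)
            val = evaluate_spec(candidate)
            if val > best_val:      # improvement found: recentre and restart at k = 1
                best, best_val, k = candidate, val, 1
            else:
                k += 1              # no improvement: enlarge the neighbourhood
    return best, best_val

if __name__ == "__main__":
    print(vns())
```

    In practice, `evaluate_spec` would be replaced by a call to an estimation routine, and the comparison would operate on several objectives (fit, parsimony, behavioural soundness) rather than a single scalar.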

    A general and operational representation of GEV models

    Generalised Extreme Value (GEV) models provide an interesting theoretical framework for developing closed-form random utility models. Unfortunately, few instances of this family are found as operational models in practice. The form of the model, based on a generating function G which must satisfy specific properties, is rather complicated. Fundamentally, it is not an easy task for the modeller to translate an intuitive perception of the correlation structure into a concrete G function. And even if the modeller succeeds in proposing a new G function, the task of proving that it indeed satisfies the required properties is cumbersome. The main objectives of this paper are (i) to provide a general theoretical foundation, so that the development of new GEV models will be easier in the future, and (ii) to propose an easy way of generating new GEV models without the need for complicated proofs. Our technique requires only a network structure capturing the underlying correlation of the choice situation under consideration. If the network complies with some simple conditions, we show how to build an associated model. We prove that it is indeed a GEV model and, therefore, complies with random utility theory. The Multinomial Logit, the Nested Logit and the Cross-Nested Logit models are specific instances of our class of models. So are the recent GenL model, which combines choice set generation and choice, and some specialised compound models used in recent work. Probability, expected maximum utility and elasticity formulae for the class of models are provided.
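
    For reference, in the common GEV notation (which may differ slightly from the paper's), with $y_j = e^{V_j}$ and a generating function $G$ that is homogeneous of degree $\mu$, the choice probabilities and the expected maximum utility mentioned above take the form
    \[
    P(i \mid C) \;=\; \frac{y_i\,\dfrac{\partial G}{\partial y_i}(y_1,\dots,y_J)}{\mu\, G(y_1,\dots,y_J)},
    \qquad
    \mathbb{E}\Big[\max_{j} U_j\Big] \;=\; \frac{\ln G(y_1,\dots,y_J) + \gamma}{\mu},
    \]
    where $\gamma$ is Euler's constant; the network-based construction proposed in the paper yields a $G$ with the required properties without a case-by-case proof.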

    A class of multi-iterate methods to solve systems of nonlinear equations

    A new class of methods for solving systems of nonlinear equations is introduced. The main idea is to build a linear model using a population of previous iterates. Contrary to classical secant methods, where exact interpolation is used, we prefer a least squares approach to calibrate the linear model. We propose an explicit control of the numerical stability of the method. We show that our approach can lead to an update formula; in that case, we prove the local convergence of the corresponding quasi-Newton method. Finally, computational comparisons with classical methods highlight a significant improvement in terms of robustness and number of function evaluations. We also present preliminary numerical tests showing the robust behavior of our methods in the presence of noise.
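
    The core idea can be sketched as follows; the notation is assumed here, and the paper's actual formulation additionally includes an explicit control of numerical stability. Given previous iterates $x_0,\dots,x_k$ of a system $F(x)=0$, the linear model is calibrated by least squares and used for a quasi-Newton step:
    \[
    A_k \;\in\; \arg\min_{A}\; \sum_{j<k} \big\| F(x_j) - F(x_k) - A\,(x_j - x_k) \big\|^{2},
    \qquad
    x_{k+1} \;=\; x_k - A_k^{-1} F(x_k),
    \]
    in contrast with secant methods, which would require the interpolation conditions $F(x_j) - F(x_k) = A\,(x_j - x_k)$ to hold exactly.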

    PAPABILES: Simulation-based evaluation of the impact of telematics in the Lausanne area: a pilot study

    At the present time, variable-message signs (VMS) and variable speed-limit signs are in frequent use for traffic control, especially along urban motorways. The Lausanne by-pass is partially equipped with these and should be fully equipped in the near future. This study evaluates the effects of such systems on both the efficiency of the road network and the safety of its users. The PAPABILES pilot study deals with the evaluation of the potential effects of such control systems on the performance and safety of the network, using a state-of-the-art simulation tool (MITSIM) developed at the Massachusetts Institute of Technology. In this paper, we present the scenarios that have been tested and comment on the results. Preliminary analysis of the impact of variable speed-limit signs emphasizes the following elements:
    a) The reduction of the speed limit in the high-flow scenarios did not produce a significant increase in the performance of the motorway network, where the limit is usually 120 km/h. For limits lower than 100 km/h, performance actually seems to decrease. Admittedly, some results tend to show a slight improvement in performance for speeds around 105 km/h, but the magnitude of these improvements is too small to justify installing such equipment for the sole purpose of increasing the performance of the network.
    b) Regarding road-users' safety under high flow, a lower speed limit decreases the probability and the severity of an accident when traffic breaks down from a normal regime to a congested regime. As mentioned above, this safety improvement does not significantly affect the system's performance in terms of throughput.
    c) In the case of an incident that notably reduces the capacity of the motorway, the simulations carried out so far show that the application of various speed-limitation scenarios does not improve the performance of the network: the capacity of the network is governed by the capacity at the incident location, and the actual speed is already below the limit. Again, the role of speed limitation is more beneficial for safety than for throughput.
    We emphasize that, due to the limited calibration of the model, the results must be interpreted with care. We believe that their interpretation is valid, but that their actual impact must be analyzed in more detail; this will be done in subsequent phases of the project.

    Capturing correlation in large-scale route choice models

    When using random utility models for a route choice problem, choice set generation and correlation among alternatives are two issues that make the modeling complex. In this paper we discuss different models capturing path overlap. First, we analyze several formulations of the Path Size Logit model proposed in the literature and show that the original formulation should be used. Second, we propose a modeling approach where the path overlap is captured with a subnetwork. A subnetwork is a simplification of the road network containing only easily identifiable and behaviorally relevant roads. In practice, the subnetwork can easily be defined based on the route network hierarchy. We propose a model where the subnetwork is used to define the correlation structure of the choice model. The motivation is to explicitly capture the most important correlation without considerably increasing the model complexity. We present estimation results for a factor analytic specification of a mixture of Multinomial Logit models, where the correlation among paths is captured both by a Path Size attribute and by error components. The estimation is based on a GPS dataset collected in the Swedish city of Borlänge. The results show a significant increase in model fit for the Error Component model compared to a Path Size Logit model. Moreover, the correlation parameters are significant.
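
    For reference, the original Path Size formulation discussed above corrects the systematic utility of path $i$ with a logarithmic term; in the usual notation, $\Gamma_i$ is the set of links of path $i$, $l_a$ and $L_i$ are link and path lengths, and $\delta_{aj} = 1$ if link $a$ belongs to path $j$:
    \[
    PS_i \;=\; \sum_{a \in \Gamma_i} \frac{l_a}{L_i}\,\frac{1}{\sum_{j \in C} \delta_{aj}},
    \qquad
    P(i \mid C) \;=\; \frac{e^{\,V_i + \beta_{PS}\ln PS_i}}{\sum_{j \in C} e^{\,V_j + \beta_{PS}\ln PS_j}},
    \]
    so that heavily overlapping paths receive a smaller path size and hence a utility penalty; in the mixture model estimated here, error components defined on the subnetwork capture the remaining correlation.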